A multimodal tempo and beat-tracking system based on audiovisual information from live guitar performances
Authors
Abstract
The aim of this paper is to improve beat-tracking for live guitar performances. Beat-tracking estimates musical properties such as tempo and beat phase, and is critical for synchronized ensemble performance, for example musical robot accompaniment. Beat-tracking of a live guitar performance must cope with three challenges: tempo fluctuation, beat-pattern complexity, and environmental noise. To address these problems, we devise an audiovisual integration method for beat-tracking. The auditory beat features, tactus (phase) and tempo (period), are estimated by Spectro-Temporal Pattern Matching (STPM), which is robust against stationary noise. The visual beat features are estimated by tracking the position of the hand relative to the guitar using optical flow, mean shift, and the Hough transform. The two feature streams are integrated by a particle filter that aggregates the multimodal information based on a beat-location model and a hand-trajectory model. Experimental results confirm that our beat-tracking improves the F-measure by 8.9 points on average over the Murata beat-tracking method, which combines STPM with rule-based beat detection. The results also show that the system is capable of real-time processing with a reduced number of particles while preserving estimation accuracy. Finally, we demonstrate an ensemble in which the humanoid HRP-2 plays the theremin alongside a human guitarist.
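To make the integration step concrete, the following Python sketch shows a particle filter that fuses a per-frame audio onset observation (standing in for the STPM tactus/tempo features) with a binary visual observation of the strumming hand crossing the strings. The state model, the two likelihood functions, and every parameter value are illustrative assumptions chosen for exposition, not the authors' implementation; only the overall scheme (predict, reweight with both modalities, resample, estimate tempo and phase) follows the abstract.

import numpy as np

rng = np.random.default_rng(0)

N  = 500     # particle count; the paper keeps this small for real-time operation
DT = 0.01    # observation frame hop in seconds (assumed)

# Particle state: beat phase in [0, 1) and beat period in seconds per beat.
phase   = rng.uniform(0.0, 1.0, N)
period  = rng.uniform(0.4, 1.0, N)        # roughly 60-150 BPM
weights = np.full(N, 1.0 / N)

def predict():
    """Advance each particle by one frame and add process noise."""
    global phase, period
    phase  = (phase + DT / period + rng.normal(0, 0.005, N)) % 1.0
    period = np.clip(period + rng.normal(0, 0.002, N), 0.3, 1.2)

def update(audio_onset, hand_stroke):
    """Reweight particles with both modalities.
    audio_onset: onset strength in [0, 1] (stand-in for the STPM feature).
    hand_stroke: 1.0 if the tracked hand crosses the strings, else 0.0.
    Both likelihoods are hypothetical stand-ins for the paper's
    beat-location and hand-trajectory models."""
    global weights
    near_beat = np.exp(-0.5 * (np.minimum(phase, 1.0 - phase) / 0.05) ** 2)
    lik_audio = near_beat * audio_onset + (1.0 - near_beat) * (1.0 - audio_onset)
    lik_video = near_beat * hand_stroke + (1.0 - near_beat) * (1.0 - hand_stroke)
    weights  *= 0.5 * (lik_audio + lik_video) + 1e-12
    weights  /= weights.sum()

def resample():
    """Systematic resampling when the effective sample size collapses."""
    global phase, period, weights
    if 1.0 / np.sum(weights ** 2) < N / 2:
        cs = np.cumsum(weights)
        cs[-1] = 1.0                        # guard against floating-point shortfall
        idx = np.searchsorted(cs, (rng.random() + np.arange(N)) / N)
        phase, period, weights = phase[idx], period[idx], np.full(N, 1.0 / N)

def estimate():
    """Weighted-mean tempo (BPM) and circular-mean beat phase."""
    bpm = 60.0 / np.average(period, weights=weights)
    ang = 2.0 * np.pi * phase
    ph  = np.arctan2(np.average(np.sin(ang), weights=weights),
                     np.average(np.cos(ang), weights=weights)) / (2.0 * np.pi)
    return bpm, ph % 1.0

# Toy run: both modalities observe a clean 120-BPM pulse (a beat every 0.5 s).
for t in range(2000):
    predict()
    obs = 1.0 if (t * DT) % 0.5 < DT else 0.0
    update(audio_onset=obs, hand_stroke=obs)
    resample()
print("tempo (BPM), beat phase:", estimate())

Shrinking N trades estimation accuracy for speed, which is the real-time trade-off the abstract refers to when it mentions suppressing the number of particles.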
Related articles
A Beat Tracking System for Audio Signals
We present a system which processes audio signals sampled from recordings of musical performances, and estimates the tempo at each point throughout the piece. The system employs a bottom-up approach to beat tracking from acoustic signals, assuming no a priori high-level knowledge of the music such as the time signature or approximate tempo, but rather deriving this information from the timing p...
Beat Tracking with Musical Knowledge
When a person taps a foot in time with a piece of music, they are performing beat tracking. Beat tracking is fundamental to the understanding of musical structure, and therefore an essential ability for any system which purports to exhibit musical intelligence or understanding. We present an off-line multiple agent beat tracking system which estimates the locations of musical beats in MIDI perf...
Automatic Extraction of Tempo and Beat from Expressive Performances
We describe a computer program which is able to estimate the tempo and the times of musical beats in expressively performed music. The input data may be either digital audio or a symbolic representation of music such as MIDI. The data is processed off-line to detect the salient rhythmic events and the timing of these events is analysed to generate hypotheses of the tempo at various metrical lev...
IBT: A Real-time Tempo and Beat Tracking System
This paper describes a tempo induction and beat tracking system based on the efficient strategy (initially introduced in the BeatRoot system [Dixon S., “Automatic extraction of tempo and beat from expressive performances.” Journal of New Music Research, 30(1):39-58, 2001]) of competing agents processing musical input sequentially and considering parallel hypotheses regarding tempo and beats. In...
What Makes Beat Tracking Difficult? A Case Study on Chopin Mazurkas
The automated extraction of tempo and beat information from music recordings is a challenging task. Especially in the case of expressive performances, current beat tracking approaches still have significant problems accurately capturing local tempo deviations and beat positions. In this paper, we introduce a novel evaluation framework for detecting critical passages in a piece of music that ar...
Journal: EURASIP J. Audio, Speech and Music Processing
Volume: 2012, Issue: -
Pages: -
Published: 2012